Add Composition Support to LoRA and (IA)³ #598
Conversation
Regarding the refactoring, I did a run with the backwards compatibility tests and they pass ✅
Looks good. I have just a few comments.
It looks good to me; thanks for working on this! I only left some minor comments.
Co-authored-by: TimoImhof <[email protected]>
Follow-up to #591.
This PR provides initial support for adapter composition in LoRA & (IA)³ modules. Currently, LoRA & (IA)³ do not support composition. With this PR, the following blocks will be supported: Stack, BatchSplit, Average, Parallel.
Additionally, the LoRA implementation is refactored to make it cleaner. A usage sketch of the new composition blocks follows below.
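As an illustration, here is a minimal sketch of how the newly supported composition blocks might be activated on LoRA / (IA)³ adapters, assuming an adapters-style API (`adapters.init`, `add_adapter`, `active_adapters`); the model checkpoint, adapter names, and batch sizes are placeholders, not part of this PR.

```python
# Hypothetical usage sketch; names and exact signatures may differ
# from the final API in this PR.
import adapters
from adapters import LoRAConfig, IA3Config
from adapters.composition import Stack, BatchSplit, Average, Parallel
from transformers import AutoModel

model = AutoModel.from_pretrained("roberta-base")
adapters.init(model)  # make the base model adapter-aware

# Two LoRA adapters and one (IA)^3 adapter on the same model.
model.add_adapter("lora_a", config=LoRAConfig())
model.add_adapter("lora_b", config=LoRAConfig())
model.add_adapter("ia3_c", config=IA3Config())

# Stack: pass activations through lora_a first, then lora_b.
model.active_adapters = Stack("lora_a", "lora_b")

# BatchSplit: route the first 4 examples of each batch through
# lora_a and the remaining 4 through ia3_c.
model.active_adapters = BatchSplit("lora_a", "ia3_c", batch_sizes=[4, 4])

# Average: combine the outputs of both adapters by averaging.
model.active_adapters = Average("lora_a", "lora_b")

# Parallel: replicate the input and run both adapters side by side.
model.active_adapters = Parallel("lora_a", "lora_b")
```

Each assignment to `active_adapters` replaces the previously active composition, so only the last block set is used at forward time.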
Limitations
Composition is not supported for models using the `LoRAMergedLinear` implementation. These currently are: GPT-2, DeBERTa (v1).
TODO